The VC-Dimension of Similarity Hypotheses Spaces

Authors

  • Mark Herbster
  • Paul Rubenstein
  • James Townsend
Abstract

Given a set X and a function h : X → {0, 1} which labels each element of X with either 0 or 1, we may define a function ĥ to measure the similarity of pairs of points in X according to h. Specifically, for h ∈ {0, 1}^X we define ĥ ∈ {0, 1}^(X×X) by ĥ(w, x) := 1[h(w) = h(x)]. This idea can be extended to a set of functions, or hypothesis space, H ⊆ {0, 1}^X by defining a similarity hypothesis space Ĥ := {ĥ : h ∈ H}. We show that vc-dimension(Ĥ) ∈ Θ(vc-dimension(H)).
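The construction in the abstract lifts a point-labeling function to a pair-labeling one. A minimal sketch in Python (the labeling function used below is a hypothetical example, not from the paper):

```python
def similarity(h):
    """Lift a labeling h : X -> {0, 1} to the pair function
    h_hat(w, x) = 1[h(w) == h(x)], which is 1 exactly when h agrees on w and x."""
    return lambda w, x: int(h(w) == h(x))

# Hypothetical labeling: even integers get 1, odd integers get 0.
h = lambda n: n % 2 == 0
h_hat = similarity(h)

print(h_hat(2, 4))  # 1: h agrees on both points
print(h_hat(2, 3))  # 0: h disagrees
```

Applying `similarity` to every h in a hypothesis space H yields the similarity hypothesis space Ĥ whose VC-dimension the paper relates to that of H.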

Related papers

Approximate Faithful Embedding in Learning

In this paper we consider the problem of embedding the input and hypotheses of boolean function classes in other classes, such that the natural metric structure of the two spaces is approximately preserved. We first prove some general properties of such embedding and then suggest and discuss possible approximate embedding in the class of "half-spaces" (single layer perceptrons) with dimension pol...

Full text

COS 511 : Theoretical Machine Learning

Last class, we discussed an analogue of Occam’s Razor for infinite hypothesis spaces that, in conjunction with VC-dimension, reduced the problem of finding a good PAC-learning algorithm to the problem of computing the VC-dimension of a given hypothesis space. Recall that VC-dimension is defined using the notion of a shattered set, i.e. a subset S of the domain such that Π_H(S) = 2^|S|. In this le...
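The shattering condition Π_H(S) = 2^|S| can be checked directly for a finite hypothesis class. A small sketch, using hypothetical threshold classifiers h_t(x) = 1[x ≥ t] as the example class:

```python
def restrictions(H, S):
    """The distinct labelings that hypotheses in H induce on the sample S (Pi_H(S))."""
    return {tuple(h(x) for x in S) for h in H}

def is_shattered(H, S):
    """S is shattered by H iff H realizes all 2^|S| labelings of S."""
    return len(restrictions(H, S)) == 2 ** len(S)

# Hypothetical example: threshold classifiers h_t(x) = 1[x >= t] for t = 0..4.
thresholds = [lambda x, t=t: int(x >= t) for t in range(5)]

print(is_shattered(thresholds, [2]))     # True: one point can receive both labels
print(is_shattered(thresholds, [1, 3]))  # False: no threshold labels 1 -> 1 and 3 -> 0
```

Thresholds shatter every singleton but no two-point set, which is why their VC-dimension is 1.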

Full text

Learning Multiple Tasks using Shared Hypotheses

In this work we consider a setting where we have a very large number of related tasks with few examples from each individual task. Rather than either learning each task individually (and having a large generalization error) or learning all the tasks together using a single hypothesis (and suffering a potentially large inherent error), we consider learning a small pool of shared hypotheses. Each...

Full text

Limitations of Learning via Embeddings in Euclidean Half-Spaces

The notion of embedding a class of dichotomies in a class of linear half spaces is central to the support vector machines paradigm. We examine the question of determining the minimal Euclidean dimension and the maximal margin that can be obtained when the embedded class has a finite VC dimension. We show that an overwhelming majority of the family of finite concept classes of any constant VC di...

Full text

Limitations of Learning Via Embeddings

This paper considers the embeddability of general concept classes in Euclidean half spaces. By embedding in half spaces we refer to a mapping from some concept class to half spaces so that the labeling given to points in the instance space is retained. The existence of an embedding for some class may be used to learn it using an algorithm for the class it is embedded into. The Support Vector Ma...

Full text



Journal:
  • CoRR

Volume: abs/1502.07143  Issue: –

Pages: –

Publication date: 2015